
    Flow behavior in liquid molding

    The liquid molding (LM) process for manufacturing polymer composites with structural properties has the potential to significantly lower fabrication costs and increase production rates. LM includes both resin transfer molding and structural reaction injection molding. To achieve this potential, however, the underlying science base must be improved to facilitate effective process optimization and implementation of on-line process control. The National Institute of Standards and Technology (NIST) has a major program in LM that includes materials characterization, process simulation models, on-line process monitoring and control, and the fabrication of test specimens. The results of this program are applied to real parts through cooperative projects with industry. The key feature of the effort is a comprehensive and integrated approach to the processing-science aspects of LM. This paper briefly outlines the NIST program and uses several examples to illustrate the work.

    Looking at the Schizophrenia Spectrum Through the Prism of Self-disorders: An Empirical Study

    Nonpsychotic anomalies of subjective experience were emphasized in both classic literature and phenomenological psychiatry as essential clinical features of schizophrenia. However, only in recent years has their relevance to the construct validity of the schizophrenia-spectrum concept been explicitly acknowledged, mainly as a consequence of the increasing focus on early detection and prevention of psychosis. The current study tested the hypothesis of a specific aggregation of self-disorders (SDs, various anomalies of self-awareness) in schizophrenia-spectrum conditions, comparing different diagnostic groups; 305 subjects, previously assessed in the Copenhagen Schizophrenia Linkage Study, were grouped into 4 experimental samples according to their Diagnostic and Statistical Manual of Mental Disorders (Third Edition Revised) main diagnosis: schizophrenia (n = 29), schizotypal personality disorder (n = 61), other mental illness not belonging to the schizophrenia spectrum (n = 112), and no mental illness (n = 103). The effect of diagnostic grouping on the level of SDs was explored via general linear model and logistic regression. The diagnoses of schizophrenia and schizotypy predicted higher levels of SDs, and SD scores differed significantly between spectrum and nonspectrum samples; the likelihood of experiencing SDs increased as well with diagnostic severity. The findings support the assumption that SDs are a discriminating psychopathological feature of the schizophrenia spectrum and suggest their incorporation to strengthen its construct validity, with potential benefit for both early detection and pathogenetic research.

    Successful software engineering research


    Punctuated Equilibrium in Software Evolution

    An approach based on the paradigm of self-organized criticality is proposed for the experimental investigation and theoretical modelling of software evolution. The dynamics of modifications are studied for three free, open-source programs (Mozilla, FreeBSD, and Emacs) using data from their version control systems. Scaling laws typical of self-organized criticality are found. A model of software evolution embodying the natural-selection principle is proposed, and the results of numerical and analytical investigation of the model are presented. They are in good agreement with the data collected for the real-world software.
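    The scaling-law check described above can be sketched on hypothetical data: bin the sizes of modification "avalanches" (for example, files touched per commit, as read from a version control log), and look for a straight line on a log-log plot of size versus frequency. The data below are synthetic; nothing here reproduces the paper's actual measurements.

    ```python
    import math
    from collections import Counter

    def avalanche_size_distribution(change_sizes):
        """Histogram of avalanche sizes, e.g. files touched per commit."""
        return Counter(change_sizes)

    def loglog_slope(dist):
        """Least-squares slope of log(count) vs log(size); a roughly
        straight line with negative slope is the scaling-law signature
        associated with self-organized criticality."""
        pts = [(math.log(s), math.log(c)) for s, c in dist.items() if s > 0]
        n = len(pts)
        sx = sum(x for x, _ in pts)
        sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts)
        sxy = sum(x * y for x, y in pts)
        return (n * sxy - sx * sy) / (n * sxx - sx * sx)

    # Synthetic commit log whose size frequencies follow ~size^-2,
    # so the fitted slope should come out near -2.
    sizes = []
    for s in range(1, 50):
        sizes.extend([s] * max(1, int(10000 / (s * s))))
    print(loglog_slope(avalanche_size_distribution(sizes)))
    ```

    On real data one would read `sizes` from the version control system rather than generating it.
    
    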

    On the Effective Manipulation of Digital Objects: A Prototype-Based Instantiation Approach

    This paper elaborates on the design and development of an effective digital object manipulation mechanism that facilitates the generation of configurable Digital Library application logic, as expressed by collection manager, cataloguing and browsing modules. Our work aims to resolve the issue that digital object typing information can currently be utilized only by humans as a guide and not by programs as a digital object type conformance mechanism. Drawing on the notions of the Object Oriented Model, we propose a "type checking" mechanism that automates the conformance of digital objects to their type definitions, named digital object prototypes. We pinpoint the practical benefits gained by our approach in the development of the University of Athens Digital Library, in terms of code reuse and configuration capabilities.
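    A minimal sketch of the idea, with entirely hypothetical names (this is not the University of Athens implementation): a prototype records the metadata fields and file types its instances must carry, so that a program, rather than a human, can check whether a digital object conforms to its type.

    ```python
    class DigitalObjectPrototype:
        """Hypothetical prototype: names the metadata fields and file
        types a conforming digital object must supply."""

        def __init__(self, name, required_fields, allowed_mime_types):
            self.name = name
            self.required_fields = set(required_fields)
            self.allowed_mime_types = set(allowed_mime_types)

        def conforms(self, obj):
            """Type check: obj instantiates this prototype only if it
            carries every required field and an allowed file type."""
            return (self.required_fields <= obj["metadata"].keys()
                    and obj["mime_type"] in self.allowed_mime_types)

    photo = DigitalObjectPrototype(
        "photograph", {"title", "creator", "date"},
        {"image/jpeg", "image/tiff"})

    ok = {"metadata": {"title": "t", "creator": "c", "date": "1999"},
          "mime_type": "image/jpeg"}
    bad = {"metadata": {"title": "t"}, "mime_type": "application/pdf"}
    print(photo.conforms(ok), photo.conforms(bad))  # True False
    ```

    Application logic (cataloguing, browsing) could then be driven by the prototype rather than hard-coded per collection.
    
    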

    Dynamic Approximate Vertex Cover and Maximum Matching

    We consider the problem of maintaining a large matching or a small vertex cover in a dynamically changing graph. Each update to the graph is either an edge deletion or an edge insertion. We give the first randomized data structure that simultaneously achieves a constant approximation factor and handles a sequence of k updates in k · polylog(n) time. Previous data structures require a polynomial amount of computation per update. The starting point of our construction is a distributed algorithm of Parnas and Ron (Theor. Comput. Sci. 2007), which they designed for their sublinear-time approximation algorithm for the vertex cover size. This leads us to wonder whether there are other connections between sublinear algorithms and dynamic data structures. Funding: National Science Foundation (U.S.) grants 0732334 and 0728645; Marie Curie International Reintegration Grant PIRG03-GA-2008-231077; Israel Science Foundation grants 1147/09 and 1675/09.
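    The constant approximation factor rests on a classical fact that is easy to verify statically: the endpoints of any maximal matching form a vertex cover at most twice the optimum, and the matching itself is at least half the maximum. The sketch below recomputes a greedy maximal matching from scratch; the paper's contribution is maintaining one under updates in k · polylog(n) total time, which this illustration does not attempt.

    ```python
    def maximal_matching(edges):
        """Greedily add every edge whose endpoints are both unmatched;
        the result is a maximal (not necessarily maximum) matching."""
        matched = set()
        matching = []
        for u, v in edges:
            if u not in matched and v not in matched:
                matching.append((u, v))
                matched.update((u, v))
        return matching

    def vertex_cover_from_matching(matching):
        """Endpoints of a maximal matching: a 2-approximate vertex cover."""
        return {v for e in matching for v in e}

    # Path graph 1-2-3-4-5
    edges = [(1, 2), (2, 3), (3, 4), (4, 5)]
    cover = vertex_cover_from_matching(maximal_matching(edges))
    # Every edge has at least one endpoint in the cover.
    print(all(u in cover or v in cover for u, v in edges))  # True
    ```

    The dynamic version must repair maximality after each deletion without rescanning the whole graph, which is where the randomization comes in.
    
    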

    Acceptance Criteria for Critical Software Based on Testability Estimates and Test Results

    Testability is defined as the probability that a program will fail a test, conditional on the program containing some fault. In this paper, we show that statements about the testability of a program can be described more simply in terms of assumptions on the probability distribution of the failure intensity of the program. We can thus state general acceptance conditions in clear mathematical terms using Bayesian inference. We develop two scenarios: one for software that is required to be completely fault-free, and another for requirements stated as an upper bound on the acceptable failure probability.
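    The Bayesian argument can be made concrete under simple assumptions (illustrative only, not the paper's exact model): if a fault is present with prior probability p, and a faulty program fails each independent test with probability θ (the testability), then n consecutive passing tests update the fault probability by Bayes' rule to p(1−θ)^n / (p(1−θ)^n + 1 − p).

    ```python
    def posterior_fault_probability(prior_fault, testability, passed_tests):
        """Bayes' rule: probability a fault remains after `passed_tests`
        consecutive successes, assuming a faulty program fails each test
        independently with probability `testability`."""
        survive = prior_fault * (1 - testability) ** passed_tests
        return survive / (survive + (1 - prior_fault))

    # With prior 0.5 and testability 0.01: how many passing tests until
    # we are 99% confident the program is fault-free?
    n = 0
    while posterior_fault_probability(0.5, 0.01, n) > 0.01:
        n += 1
    print(n)  # → 458
    ```

    High testability thus makes acceptance cheap: the same confidence with θ = 0.1 needs roughly a tenth as many tests.
    
    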

    Coordination Implications of Software Coupling in Open Source Projects

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of increased software coupling on the coordination of developers has not been researched as much. In commercial software development environments, coordination mechanisms are normally in place to manage the coordination requirements arising from software dependencies. In Open Source software, however, such mechanisms are harder to implement, as developers tend to rely solely on electronic means of communication. Hence, an understanding of changing coordination requirements is essential to the management of an Open Source project. In this paper we study the effect of changes in software coupling on coordination requirements in a case study of JBoss, a popular Open Source project.

    Sublinear-Time Algorithms for Monomer-Dimer Systems on Bounded Degree Graphs

    For a graph G, let Z(G, λ) be the partition function of the monomer-dimer system, defined by Z(G, λ) = ∑_k m_k(G) λ^k, where m_k(G) is the number of matchings of size k in G. We consider graphs of bounded degree and develop a sublinear-time algorithm for estimating log Z(G, λ) at an arbitrary value λ > 0 within additive error εn with high probability. The query complexity of our algorithm does not depend on the size of G and is polynomial in 1/ε, and we also provide a lower bound quadratic in 1/ε for this problem. This is the first analysis of a sublinear-time approximation algorithm for a #P-complete problem. Our approach is based on the correlation decay of the Gibbs distribution associated with Z(G, λ). We show that our algorithm approximates the probability for a vertex to be covered by a matching, sampled according to this Gibbs distribution, in a near-optimal sublinear time. We extend our results to approximate the average size and the entropy of such a matching within an additive error with high probability, where again the query complexity is polynomial in 1/ε and the lower bound is quadratic in 1/ε. Our algorithms are simple to implement and of practical use when dealing with massive datasets. Our results extend to other systems where the correlation decay is known to hold, as for the independent set problem up to the critical activity.
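    The definition Z(G, λ) = ∑_k m_k(G) λ^k can be checked by brute force on a tiny graph. This exponential-time enumeration is the opposite of the paper's sublinear estimator; it only illustrates what is being estimated.

    ```python
    from itertools import combinations

    def matching_counts(edges, n_vertices):
        """m_k(G): number of matchings of size k, by brute force."""
        counts = {0: 1}  # the empty matching
        for k in range(1, n_vertices // 2 + 1):
            c = 0
            for subset in combinations(edges, k):
                verts = [v for e in subset for v in e]
                if len(verts) == len(set(verts)):  # edges vertex-disjoint
                    c += 1
            if c:
                counts[k] = c
        return counts

    def partition_function(edges, n_vertices, lam):
        """Z(G, lambda) = sum_k m_k(G) * lambda**k."""
        return sum(c * lam ** k
                   for k, c in matching_counts(edges, n_vertices).items())

    # 4-cycle: m_0 = 1, m_1 = 4, m_2 = 2, so Z(C4, 1) = 1 + 4 + 2 = 7
    c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(partition_function(c4, 4, 1.0))  # → 7.0
    ```

    At λ = 1, Z(G, λ) is simply the total number of matchings of G, which is why evaluating it exactly is #P-complete in general.
    
    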

    A framework for the simulation of structural software evolution

    As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating enough empirical data, in sufficient quantity, and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation—factors that, in real-world system development, can significantly influence evolutionary structures.
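    A toy version of the plug-in idea (entirely hypothetical, and far simpler than the framework described): each "effect module" is a function that mutates a fictitious code base, here a map from classes to their dependencies, and a simulation composes whichever modules are registered, after which structural measures such as average coupling can be read off.

    ```python
    import random

    def grow(codebase, rng):
        """Effect module: add a new class depending on an existing one."""
        new = max(codebase) + 1
        codebase[new] = {rng.choice(list(codebase))}

    def couple(codebase, rng):
        """Effect module: add a dependency between two existing classes."""
        if len(codebase) >= 2:
            a, b = rng.sample(list(codebase), 2)
            codebase[a].add(b)

    def simulate(effects, steps, seed=0):
        """Compose registered effect modules over a fictitious code base."""
        rng = random.Random(seed)
        codebase = {0: set()}
        for _ in range(steps):
            rng.choice(effects)(codebase, rng)
        return codebase

    cb = simulate([grow, couple], 200)
    avg_coupling = sum(len(deps) for deps in cb.values()) / len(cb)
    print(len(cb), round(avg_coupling, 2))
    ```

    New evolutionary effects (refactoring, developer habits, deletion) would each be one more function in the `effects` list, mirroring the incremental-module design the article describes.
    
    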